

Quantifying Climate Policy Action and Its Links to Development Outcomes: A Cross-National Data-Driven Analysis

Dutta, Aditi

arXiv.org Artificial Intelligence

Addressing climate change effectively requires more than cataloguing the number of policies in place; it calls for tools that can reveal their thematic priorities and their tangible impacts on development outcomes. Existing assessments often rely on qualitative descriptions or composite indices, which can mask crucial differences between key domains such as mitigation, adaptation, disaster risk management, and loss and damage. To bridge this gap, we develop a quantitative indicator of climate policy orientation by applying a multilingual transformer-based language model to official national policy documents, achieving a classification accuracy of 0.90 (F1-score). Linking these indicators with World Bank development data in panel regressions reveals that mitigation policies are associated with higher GDP and GNI; disaster risk management correlates with greater GNI and debt but reduced foreign direct investment; adaptation and loss and damage show limited measurable effects. This integrated NLP-econometric framework enables comparable, theme-specific analysis of climate governance, offering a scalable method to monitor progress, evaluate trade-offs, and align policy emphasis with development goals.
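The panel-regression step the abstract describes can be illustrated with a minimal fixed-effects sketch on synthetic data. This is not the paper's actual specification; the variable names, the country-dummy approach, and the 0.5 coefficient are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_years = 5, 10
n = n_countries * n_years

# Hypothetical panel: each country observed for ten years
country = np.repeat(np.arange(n_countries), n_years)
mitigation_share = rng.uniform(0, 1, n)          # share of policy text classified as mitigation
country_effect = rng.normal(0, 1, n_countries)   # unobserved, time-invariant heterogeneity
log_gdp = 0.5 * mitigation_share + country_effect[country] + rng.normal(0, 0.1, n)

# Fixed-effects design: country dummies absorb the time-invariant heterogeneity
dummies = np.eye(n_countries)[country]
X = np.column_stack([mitigation_share, dummies])
beta, *_ = np.linalg.lstsq(X, log_gdp, rcond=None)
print(f"estimated mitigation coefficient: {beta[0]:.2f}")  # recovers roughly 0.5
```

The point of the dummy columns is that any country-level driver of GDP that does not change over time is soaked up by the fixed effects, so the coefficient on the policy indicator reflects within-country variation only.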


Using a swearword in your Google search can stop the AI answer. But should you?

The Guardian

Artificial intelligence is more than Trump deepfakes of Tilly the actor. It's used in smartphones, customer service, healthcare, even legal cases. Is it possible to avoid? Using a swearword in your Google search can stop that annoying AI overview from popping up.


Beyond Naïve Prompting: Strategies for Improved Zero-shot Context-aided Forecasting with LLMs

Ashok, Arjun, Williams, Andrew Robert, Zheng, Vincent Zhihao, Rish, Irina, Chapados, Nicolas, Marcotte, Étienne, Zantedeschi, Valentina, Drouin, Alexandre

arXiv.org Artificial Intelligence

Forecasting in real-world settings requires models to integrate not only historical data but also relevant contextual information, often available in textual form. While recent work has shown that large language models (LLMs) can be effective context-aided forecasters via naïve direct prompting, their full potential remains underexplored. We address this gap with four strategies, providing new insights into the zero-shot capabilities of LLMs in this setting. ReDP improves interpretability by eliciting explicit reasoning traces, allowing us to assess the model's reasoning over the context independently from its forecast accuracy. CorDP leverages LLMs solely to refine existing forecasts with context, enhancing their applicability in real-world forecasting pipelines. IC-DP proposes embedding historical examples of context-aided forecasting tasks in the prompt, substantially improving accuracy even for the largest models. Finally, RouteDP optimizes resource efficiency by using LLMs to estimate task difficulty, routing the most challenging tasks to larger models. Evaluated on different kinds of context-aided forecasting tasks from the CiK benchmark, our strategies demonstrate distinct benefits over naïve prompting across LLMs of different sizes and families. These results open the door to further simple yet effective improvements in LLM-based context-aided forecasting.
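RouteDP's routing idea can be sketched without any model calls: score each task with a difficulty proxy and send only the hard ones to the larger model. The scorer below is a stand-in (the paper uses an LLM to estimate difficulty); the threshold and the features it looks at are invented:

```python
from dataclasses import dataclass

@dataclass
class Task:
    history: list[float]   # past values of the series
    context: str           # accompanying textual context

def difficulty(task: Task) -> float:
    # Hypothetical proxy: a volatile history and a long context suggest a harder task
    h = task.history
    volatility = max(h) - min(h) if h else 0.0
    return volatility + 0.01 * len(task.context)

def route(tasks: list[Task], threshold: float = 1.0) -> tuple[list[Task], list[Task]]:
    """Split tasks: easy ones go to a small model, hard ones to a large model."""
    easy = [t for t in tasks if difficulty(t) <= threshold]
    hard = [t for t in tasks if difficulty(t) > threshold]
    return easy, hard

tasks = [
    Task(history=[1.0, 1.1, 0.9], context="stable demand"),
    Task(history=[1.0, 5.0, 0.2],
         context="a long note about an upcoming holiday surge " * 4),
]
easy, hard = route(tasks)
print(len(easy), len(hard))  # 1 1
```

The design question the strategy raises is where to set the threshold: routing too much to the large model erases the cost savings, while routing too little degrades accuracy on exactly the tasks that needed the extra capacity.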


Causal Machine Learning in IoT-based Engineering Problems: A Tool Comparison in the Case of Household Energy Consumption

Kosioris, Nikolaos-Lysias, Nikoletseas, Sotirios, Filios, Gavrilis, Panagiotou, Stefanos

arXiv.org Artificial Intelligence

The rapid increase in computing power and the capacity to store Big Data have enabled Machine Learning predictions in a wide variety of domains. However, in many cases, existing Machine Learning tools are considered insufficient or incorrect, since they exploit only probabilistic dependencies rather than inference logic. Causal Machine Learning methods seem to close this gap. In this paper, two prevalent tools based on Causal Machine Learning methods are compared, along with their mathematical underpinnings. The operation of the tools is demonstrated by examining their responses to 18 queries based on the IDEAL Household Energy Dataset, published by the University of Edinburgh. First, it was important to evaluate the causal-relations assumption that allowed the use of this approach; this was based on pre-existing scientific knowledge of the domain and was implemented using the tools' built-in validation features. Results were encouraging and may easily be extended to other domains.
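The kind of causal query such tools answer can be illustrated without any framework at all. The sketch below runs a textbook backdoor adjustment on synthetic household data (all variables, coefficients, and the confounding structure are invented), recovering an effect that a naive comparison overstates:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
# Hypothetical confounded setting: household size drives both heating use and consumption
household_size = rng.integers(1, 5, n)                            # confounder, 1..4
heating_on = (rng.uniform(size=n) < 0.2 * household_size).astype(int)
consumption = household_size + 2.0 * heating_on + rng.normal(0, 0.5, n)

# Naive contrast is biased upward: larger households heat more AND consume more
naive = consumption[heating_on == 1].mean() - consumption[heating_on == 0].mean()

# Backdoor adjustment: stratify on the confounder, average over P(household_size)
effects, weights = [], []
for s in np.unique(household_size):
    m = household_size == s
    effects.append(consumption[m & (heating_on == 1)].mean()
                   - consumption[m & (heating_on == 0)].mean())
    weights.append(m.mean())
adjusted = float(np.average(effects, weights=weights))
print(f"naive: {naive:.2f}, adjusted: {adjusted:.2f}")  # adjusted is near the true 2.0
```

This is exactly the kind of computation that depends on the causal-relations assumption the paper validates first: the adjustment is only correct if household size really is the confounder the graph says it is.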


Google's emissions up 51% as AI electricity demand derails efforts to go green

The Guardian

Google's carbon emissions have soared by 51% since 2019 as artificial intelligence hampers the tech company's efforts to go green. While the corporation has invested in renewable energy and carbon removal technology, it has failed to curb its scope 3 emissions, which are those further down the supply chain, and are in large part influenced by a growth in datacentre capacity required to power artificial intelligence. The company reported a 27% increase in year-on-year electricity consumption as it struggles to decarbonise as quickly as its energy needs increase. Datacentres play a crucial role in training and operating the models that underpin AI products such as Google's Gemini and OpenAI's GPT-4, which powers the ChatGPT chatbot. The International Energy Agency estimates that datacentres' total electricity consumption could double from 2022 levels to 1,000TWh (terawatt hours) in 2026, approximately Japan's level of electricity demand.


Data Model Design for Explainable Machine Learning-based Electricity Applications

Fortuna, Carolina, Cerar, Gregor, Bertalanic, Blaz, Campa, Andrej, Mohorcic, Mihael

arXiv.org Artificial Intelligence

The transition from traditional power grids to smart grids, the significant increase in the use of renewable energy sources, and soaring electricity prices have triggered a digital transformation of the energy infrastructure that enables new, data-driven applications, often supported by machine learning models. However, the majority of the developed machine learning models rely on univariate data. To date, a structured study considering the role of meta-data and additional measurements resulting in multivariate data is missing. In this paper we propose a taxonomy that identifies and structures various types of data related to energy applications. The taxonomy can be used to guide application-specific data model development for training machine learning models. Focusing on a household electricity forecasting application, we validate the effectiveness of the proposed taxonomy in guiding the selection of features for various types of models. Finally, using feature importance techniques, we explain individual feature contributions to the forecasting accuracy.

1. Introduction

The transition from traditional power grids to smart grids, the significant increase in the use of renewable energy sources, and soaring electricity prices have led to an increase in complexity [1], particularly with the adoption of smart meters (SMs), energy management systems (EMSes), and intelligent electronic devices (IEDs) at the low voltage (LV) level. These devices enable innovative energy [2] and non-energy applications [3, 4], such as energy cost optimization and matching consumption with self-production from renewable energy sources. On the distribution system operator (DSO) side of the LV grid, reliability and latency are the main challenges, and complete observability of the LV grid for each substation is crucial.
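A feature-importance check of the kind the abstract mentions can be sketched with permutation importance on synthetic data. The feature names, coefficients, and linear model below are invented for illustration and are not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Hypothetical multivariate feature set for household load forecasting
temperature = rng.normal(10, 5, n)
hour_sin = np.sin(2 * np.pi * rng.integers(0, 24, n) / 24)   # time-of-day encoding
noise_feat = rng.normal(0, 1, n)                             # irrelevant meta-data column
load = 3.0 + 0.4 * temperature + 2.0 * hour_sin + rng.normal(0, 0.5, n)

X = np.column_stack([temperature, hour_sin, noise_feat])
design = lambda M: np.column_stack([np.ones(len(M)), M])     # add intercept column
beta, *_ = np.linalg.lstsq(design(X), load, rcond=None)
base_mse = np.mean((design(X) @ beta - load) ** 2)

# Permutation importance: how much does MSE rise when one feature is shuffled?
importances = {}
for j, name in enumerate(["temperature", "hour_sin", "noise"]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importances[name] = np.mean((design(Xp) @ beta - load) ** 2) - base_mse
print({k: round(v, 2) for k, v in importances.items()})
```

Shuffling a column breaks its relationship with the target while preserving its marginal distribution, so informative features produce a large error increase and irrelevant meta-data columns produce almost none, which is the behaviour a taxonomy-guided feature selection would want to surface.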


AI Is Eating Data Center Power Demand--and It's Only Getting Worse

WIRED

AI's energy use already represents as much as 20 percent of global data-center power demand, research published Thursday in the journal Joule shows. That demand from AI, the research states, could double by the end of this year, comprising nearly half of all total data-center electricity consumption worldwide, excluding the electricity used for bitcoin mining. The new research is published in a commentary by Alex de Vries-Gao, the founder of Digiconomist, a research company that evaluates the environmental impact of technology. De Vries-Gao started Digiconomist in the late 2010s to explore the impact that bitcoin mining, another extremely energy-intensive activity, would have on the environment. Looking at AI, he says, has grown more urgent over the past few years because of the widespread adoption of ChatGPT and other large language models that use massive amounts of energy. According to his research, worldwide AI energy demand is now set to surpass demand from bitcoin mining by the end of this year.


Global emissions due to AI-related chipmaking grew more than four times in 2024

Engadget

A pair of studies analyzing the effects of AI on our planet have been released and the news is fairly grim. Greenpeace studied the emissions generated from the production of the semiconductors used in AI chips and found that there was a fourfold increase in 2024. This analysis was completed using publicly available data. Many of the big chipmakers like NVIDIA rely on companies like Taiwan Semiconductor Manufacturing Co and SK Hynix Inc. for the components of GPUs and memory units. Most of this manufacturing happens in Taiwan, South Korea and Japan, where power grids are primarily reliant on fossil fuels.


Holistically Evaluating the Environmental Impact of Creating Language Models

Morrison, Jacob, Na, Clara, Fernandez, Jared, Dettmers, Tim, Strubell, Emma, Dodge, Jesse

arXiv.org Artificial Intelligence

As the performance of artificial intelligence systems has dramatically increased, so too has the environmental impact of creating these systems. While many model developers release estimates of the power consumption and carbon emissions from the final training runs for their latest models, there is comparatively little transparency into the impact of model development, hardware manufacturing, and total water usage throughout. In this work, we estimate the real-world environmental impact of developing a series of language models, ranging from 20 million to 13 billion active parameters, trained on up to 5.6 trillion tokens each. When accounting for hardware manufacturing, model development, and our final training runs, we find that our series of models released 493 metric tons of carbon emissions, equivalent to powering about 98 homes in the United States for one year, and consumed 2.769 million liters of water, equivalent to about 24.5 years of water usage by a person in the United States, even though our data center is extremely water-efficient. We measure and report the environmental impact of our model development; to the best of our knowledge we are the first to do so for LLMs, and we find that model development, the impact of which is generally not disclosed by most model developers, amounted to 50% of that of training. By looking at detailed time series data for power consumption, we also find that power usage throughout training is not consistent, fluctuating between 15% and 85% of our hardware's maximum power draw, with negative implications for grid-scale planning as demand continues to grow. We close with a discussion on the continued difficulty of estimating the environmental impact of AI systems, and key takeaways for model developers and the public at large.

In recent years, the field of artificial intelligence has progressed at an unprecedented pace, driven in large part by the development and deployment of large language and multimodal models.
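The fluctuation pattern the abstract describes, power swinging between 15% and 85% of maximum draw, can be illustrated with a synthetic trace. The node power figure, sampling interval, and dip frequency below are invented for illustration, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)
max_draw_kw = 700.0  # hypothetical maximum power draw of one training node
minutes = 1440       # one day at one sample per minute

# Synthetic utilization: heavy compute phases with occasional dips
# (e.g. checkpointing or evaluation), spanning the 15%-85% range
is_dip = rng.uniform(size=minutes) < 0.1
utilization = np.where(is_dip,
                       rng.uniform(0.15, 0.30, minutes),
                       rng.uniform(0.70, 0.85, minutes))
power_kw = utilization * max_draw_kw

print(f"utilization: min {utilization.min():.2f}, "
      f"max {utilization.max():.2f}, mean {utilization.mean():.2f}")
energy_kwh = power_kw.sum() / 60.0  # one sample per minute -> divide by 60
print(f"energy over the day: {energy_kwh:.0f} kWh")
```

The grid-planning concern follows from exactly this shape: a facility sized for the 85% peaks spends much of its time drawing far less, so aggregate demand is both large and volatile rather than flat.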


The False AI Energy Crisis

The Atlantic - Technology

Over the past few weeks, Donald Trump has positioned himself as an unabashed bull on America's need to dominate AI. Yet the president has also tied this newfound and futuristic priority to a more traditional mission of his: to go big with fossil fuels. A true AI revolution will need "double the energy" that America produces today, Trump said in a recent address to the World Economic Forum, days after declaring a national energy emergency. And he noted a few ways to supply that power: "We have more coal than anybody. We also have more oil and gas than anybody."